PAC = PAExact and Other Equivalent Models in Learning

Authors

  • Nader H. Bshouty
  • Dmitry Gavinsky
Abstract

The Probably Almost Exact model (PAExact) [BJT02] can be viewed as the Exact model relaxed so that (1) the counterexamples to equivalence queries are distributionally drawn rather than adversarially chosen, and (2) the output hypothesis is equal to the target up to negligible error (1/ω(poly) for any polynomial). This model allows studying (Almost) Exact learnability of infinite classes and is in some sense analogous to the Exact-learning model for finite classes. It is known that PAExact-learnable ⇒ PAC-learnable [BJT02]. In this paper we show that if a class is PAC-learnable (in polynomial time) then it is PAExact-learnable (in polynomial time). Therefore, PAExact-learnable = PAC-learnable. It follows from this result that if a class is PAC-learnable then it is learnable in the Probabilistic Prediction model from examples by an algorithm that runs in polynomial time for each prediction (polynomial in the log of the number of trials) and that, after a polynomial number of mistakes, achieves a hypothesis that predicts the target with probability 1 − 1/2^poly. We also show that if a class is PAC-learnable in parallel then it is PAExact-learnable in parallel. These and other results mentioned in the introduction answer the open problems posed in [B97, BJT02].
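As a rough formalization of the statement above (a sketch only; the symbols C, D, f, h, n, and p are standard learning-theory notation and are not taken verbatim from the abstract): a class C is PAExact-learnable if, for every target f ∈ C and every distribution D over the domain, a polynomial-time learner whose equivalence queries are answered with counterexamples drawn from D outputs a hypothesis h with

    \Pr_{x \sim D}\bigl[h(x) \neq f(x)\bigr] < \frac{1}{p(n)} \quad \text{for every polynomial } p \text{ (for sufficiently large } n\text{)},

i.e., negligible error. The paper's main result can then be read as the equivalence

    C \text{ is PAC-learnable in polynomial time} \iff C \text{ is PAExact-learnable in polynomial time}.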


Similar articles

Exploring Learnability between Exact and PAC

We study a model of Probably Exactly Correct (PExact) learning that can be viewed either as the Exact model (learning from Equivalence Queries only) relaxed so that counterexamples to equivalence queries are distributionally drawn rather than adversarially chosen or as the Probably Approximately Correct (PAC) model strengthened to require a perfect hypothesis. We also introduce a model of Proba...
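A hedged side-by-side sketch of how the three accuracy requirements compare (standard notation: D is the example distribution, f the target concept, h the learner's hypothesis, ε the PAC accuracy parameter, n the input size; none of these symbols appear in the excerpt itself):

    \text{PAC:}\quad    \Pr_{x \sim D}[h(x) \neq f(x)] \leq \varepsilon              \quad\text{(approximately correct)}
    \text{PAExact:}\quad \Pr_{x \sim D}[h(x) \neq f(x)] \leq \mathrm{negl}(n)         \quad\text{(negligible error)}
    \text{PExact:}\quad  \Pr_{x \sim D}[h(x) \neq f(x)] = 0                            \quad\text{(exactly correct)}

with counterexamples to equivalence queries drawn from D in the PExact and PAExact models rather than chosen adversarially.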


A Booster for the PAExact model

We give a new booster that converts any PAC-learning algorithm into a PAExact-learning algorithm achieving exponentially small error. This booster is much simpler and more efficient than previous ones, and it is the first to use deterministic hypotheses. In particular, we show that if a class C is PAC-learnable in polynomial time with constant error and sample size V (for most learnable classes...


On the Relationship between Models for Learning in Helpful Environments

The PAC and other equivalent learning models are widely accepted models for polynomial learnability of concept classes. However, negative results abound in the PAC learning framework (concept classes such as deterministic finite state automata (DFA) are not efficiently learnable in the PAC model). The PAC model’s requirement of learnability under all conceivable distributions could be considere...


A Theoretical Framework for Deep Transfer Learning

We generalize the notion of PAC learning to include transfer learning. In our framework, the linkage between the source and the target tasks is a result of having the sample distribution of all classes drawn from the same distribution of distributions, and of restricting all source and target concepts to belong to the same hypothesis subclass. We have two models: an adversary model and a rand...


On Learning vs. Refutation

Building on the work of Daniely et al. (STOC 2014, COLT 2016), we study the connection between computationally efficient PAC learning and refutation of constraint satisfaction problems. Specifically, we prove that for every concept class P, PAC-learning P is polynomially equivalent to “random-right-hand-side-refuting” (“RRHS-refuting”) a dual class P∗, where RRHS-refutation of a class Q refers ...



Journal:

Volume:   Issue:

Pages:  -

Publication year: 2002